A low-cost variational-Bayes technique for merging mixtures of probabilistic principal component analyzers
Authors
Abstract
Similar resources
A low-cost variational-Bayes technique for merging mixtures of probabilistic principal component analyzers
Mixtures of probabilistic principal component analyzers (MPPCA) have proven effective for modeling high-dimensional data sets that live on nonlinear manifolds. Briefly stated, they carry out mixture model estimation and dimensionality reduction in a single process. This paper makes two contributions: first, we present a Bayesian technique for estimating such mixture models. Then, assuming sever...
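The core idea of MPPCA — covering a nonlinear manifold with several local linear projections — can be illustrated with a deliberately crude stand-in: hard-assign points to regions with k-means, then fit an ordinary PCA per region. This is a minimal sketch of the idea only, not the paper's variational-Bayes estimator (which fits soft responsibilities and noise variances jointly); all data shapes and parameter choices below are illustrative assumptions.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)

# Synthetic data: two noisy 1-D segments embedded in 3-D, i.e. a
# piecewise-linear manifold that a single global PCA cannot capture.
t = rng.uniform(-1.0, 1.0, size=(200, 1))
seg_a = np.hstack([t, 2.0 * t, 0.05 * rng.standard_normal((200, 1))])
seg_b = np.hstack([t, -t + 6.0, 0.05 * rng.standard_normal((200, 1))])
X = np.vstack([seg_a, seg_b])

K, q = 2, 1  # number of local analyzers, latent dimensionality

# Hard-assign points to local regions, then fit one PCA per region
# (a rough surrogate for the responsibilities an EM fit would learn).
labels = KMeans(n_clusters=K, n_init=10, random_state=0).fit_predict(X)
local_pcas = [PCA(n_components=q).fit(X[labels == k]) for k in range(K)]

# Reconstruction error of each point under its own local model.
errors = np.empty(len(X))
for k, pca in enumerate(local_pcas):
    mask = labels == k
    Xk_rec = pca.inverse_transform(pca.transform(X[mask]))
    errors[mask] = np.sum((X[mask] - Xk_rec) ** 2, axis=1)
mean_err = errors.mean()
```

Because each segment is locally linear, a one-component PCA per region reconstructs the data almost perfectly, while a single global one-component PCA could not.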
Mixtures of probabilistic principal component analyzers.
Principal component analysis (PCA) is one of the most popular techniques for processing, compressing, and visualizing data, although its effectiveness is limited by its global linearity. While nonlinear variants of PCA have been proposed, an alternative paradigm is to capture data complexity by a combination of local linear PCA projections. However, conventional PCA does not correspond to a pro...
Mixtures of robust probabilistic principal component analyzers
Mixtures of probabilistic principal component analyzers model high-dimensional nonlinear data by combining local linear models. Each mixture component is specifically designed to extract the local principal orientations in the data. An important issue with this generative model is its sensitivity to data lying off the low-dimensional manifold. In order to address this problem, the mixtures of r...
Mixtures of Principal Component Analyzers
Principal component analysis (PCA) is a ubiquitous technique for data analysis but one whose effective application is restricted by its global linear character. While global nonlinear variants of PCA have been proposed, an alternative paradigm is to capture data nonlinearity by a mixture of local PCA models. However, existing techniques are limited by the absence of a probabilistic formalism wi...
Almost autonomous training of mixtures of principal component analyzers
In recent years, a number of mixtures of local PCA models have been proposed. Most of these models require the user to set both the number of submodels (local models) in the mixture and the dimensionality of the submodels (i.e., the number of PCs). To make the model free of these parameters, we propose a greedy expectation maximization algorithm to find a suboptimal number of submodels. For a g...
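The greedy strategy described above — growing the mixture one submodel at a time and stopping when a larger model no longer pays off — can be sketched generically. The sketch below is an assumption-laden analogue, not the cited algorithm: it uses scikit-learn's `GaussianMixture` in place of local PCA submodels and BIC as the stopping criterion, whereas the paper defines its own greedy EM procedure.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(1)

# Three well-separated Gaussian blobs in 2-D.
centers = np.array([[0.0, 0.0], [6.0, 0.0], [0.0, 6.0]])
X = np.vstack([rng.normal(loc=c, scale=0.3, size=(150, 2)) for c in centers])

# Greedily grow the number of components, keeping each larger model
# only while it improves the BIC score; stop at the first increase.
best_k, best_bic = None, np.inf
for k in range(1, 8):
    gm = GaussianMixture(n_components=k, n_init=3, random_state=0).fit(X)
    bic = gm.bic(X)
    if bic >= best_bic:
        break  # a larger mixture no longer justifies its extra parameters
    best_k, best_bic = k, bic
```

On this toy data the loop settles on three components, matching the generating process, because BIC's complexity penalty outweighs the marginal likelihood gain of a fourth component.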
Journal
Journal title: Information Fusion
Year: 2013
ISSN: 1566-2535
DOI: 10.1016/j.inffus.2012.08.005